
Aiming at the ecology and calculating long-term profits?


In this round of the AI innovation cycle, "developer" is the keyword on industry leaders' lips.
Tang Daosheng of Tencent once said at the company's technology open conference: "In a world where the digital and the real converge, developers are the most important 'architects'."
Zhang Yong was even more direct in an interview: "We have a principle that we will implement resolutely: there are things we will do and things we will not do. In other words, we will entrust half of our life to our partners."


The reason for attaching such importance to making friends is that competition in the large-model field has shifted from fighting alone to fighting in groups; it is now a competition between ecosystems.
After all, training a large model requires computing power costing hundreds of millions, and deploying large models at scale can easily run into the tens of billions.
For example, to guarantee computing power for ChatGPT, its backer Microsoft spent hundreds of millions of dollars and tens of thousands of NVIDIA A100 chips to build a supercomputing platform, and deployed hundreds of thousands of GPUs across more than 60 Azure data centers. The cost of ChatGPT's inference alone is estimated to run into the billions of dollars.
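To get a feel for why inference bills balloon at this scale, here is a minimal back-of-envelope sketch in Python; every input (traffic volume, per-query cost) is an illustrative assumption, not a disclosed figure:

```python
# Back-of-envelope estimate of large-model inference cost.
# All inputs below are illustrative assumptions, not disclosed figures.

queries_per_day = 10_000_000   # assumed daily query volume at ChatGPT scale
cost_per_query = 0.01          # assumed serving cost per query, in USD

daily_cost = queries_per_day * cost_per_query
annual_cost = daily_cost * 365

print(f"Daily inference cost:  ${daily_cost:>13,.0f}")   # $100,000
print(f"Annual inference cost: ${annual_cost:>13,.0f}")  # ~$36.5 million

# A 10x increase in both traffic and per-query cost would already push
# the annual bill toward the multi-billion-dollar range cited above.
```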



And even that money was not enough: in April 2023, OpenAI suspended new ChatGPT Plus sign-ups for consumer (C-end) users under the weight of these costs.

Such a huge investment requires sufficiently rich business scenarios to avoid losses. However, the application scenarios that large-model companies can penetrate on their own are very limited, so they must rely on outside forces to extend their reach into thousands of industries.
OpenAI founder Sam Altman made this point in January this year: the key lies in the middle layer. A group of new startups will take existing large models and fine-tune them; they have unique data flywheels, and their continuous improvement over time will feed back into the big models and create enormous value.


Put concretely: the real-world application and debugging carried out by many developers and large-model customers dilutes the vendor's costs of data acquisition and model fine-tuning (costs that sit on top of the large model's base cost); and as the product catalog is enriched, customers can access and use cloud products at any time, improving both development and sales efficiency.
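The cost-dilution logic is easy to make concrete. The sketch below (in Python, with purely hypothetical cost figures) shows how a fixed model-building cost, amortized across a growing base of ecosystem customers, drives the average cost per customer down:

```python
# Sketch of fixed-cost amortization across an ecosystem of customers.
# FIXED_COST and MARGINAL_COST are hypothetical placeholders.

def avg_cost_per_customer(fixed_cost: float, marginal_cost: float, n: int) -> float:
    """Fixed cost spread over n customers, plus the per-customer serving cost."""
    return fixed_cost / n + marginal_cost

FIXED_COST = 100_000_000    # assumed one-off training / fine-tuning spend (USD)
MARGINAL_COST = 50_000      # assumed per-customer serving and support cost (USD)

for n in (10, 100, 1_000, 10_000):
    cost = avg_cost_per_customer(FIXED_COST, MARGINAL_COST, n)
    print(f"{n:>6} customers -> ${cost:>12,.0f} each")

# 10 customers each shoulder ~$10.05M of the model's base cost;
# 10,000 customers shoulder only $60,000 each -- which is why vendors
# court developers to spread the base cost of the large model.
```

The more developers plug in, the closer the average cost falls toward the marginal serving cost, which is the whole point of "making friends."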
This pattern already played out in the earlier overseas cloud wars. Take AWS as an example: it has been "calling on friends" since 2013.


With the support of partners, AWS launched 1,957 new features and services in 2018 alone, 90%-95% of which were derived from customer feedback, spanning infrastructure products, vertical industry solutions and other fields.
AWS reciprocates by giving partners a sales channel, Marketplace (think of it as an App Store for cloud products), which as of May this year had accumulated 2 million subscribers, allowing partners' products to reach millions of users around the world quickly.


Developers on the Marketplace reportedly expand their transaction scale by 80% and close deals 40% faster. Attracted by this, more and more developers are eager to come aboard.



By relying on the product ecosystem to expand sales, and using that sales volume to attract still more developers to settle in, Marketplace has become a cloud-based "App Store" earning more than one billion US dollars a year.
And beyond the cash, as this ecosystem flywheel matures and developers take root ever deeper, the giants stand to gain even more.


Back to the present, OpenAI seems to want to take this path.


As of January 2023, it had formed partnerships with 902 companies across technology, education, manufacturing, finance, retail and other industries, and it was recently disclosed to be considering an application store where customers could sell customized AI models to enterprises. Its "technology-product-ecosystem" trilogy is gradually taking shape.
Compared with their overseas peers, domestic cloud vendors, which started later, suffer a more pronounced problem of missing links in the ecosystem chain, and their need here is all the more urgent.


As of 2022, the penetration rate of domestic low-code software in the enterprise software market was less than 1%, and among companies with some understanding of low-code, fewer than 10% had tried or implemented it.
Without a large-scale ecosystem foundation, cloud vendors that want to push applications into every vertical segment must rely on in-house development or external procurement: Alibaba Cloud has exemplified the former, Tencent (before this year) the latter. Either way, it is somewhat thankless work.


Now, however, this industry structure looks likely to change with the trend of AI-driven cloud migration.
In this wave, as noted above, cloud companies' cost pressure has multiplied; they too need to "borrow strength" to spread their expenses, so they can travel light and race against time.


Against this backdrop, cloud-based companies have grown even more dependent on the ecosystem, which hands the giants that own large models a genuine opportunity to kick-start their ecosystems and realize an OpenAI-style "technology-product-developer" flywheel.
And with the enrichment of the ecosystem, customers have more freedom to choose products, which can also increase pricing flexibility to a certain extent.
For example, Alibaba Cloud once mentioned: "All platform-level products will be cloud-based and ready to use out of the box. Cloud services will no longer be paid in terms of resources, but in terms of business results."


When the ecosystem matures, domestic cloud manufacturers can follow the example of Amazon and Microsoft in monetizing it to boost revenue.
Not only that: once the large-model ecosystem is up and running, it may reshape users' habits away from private clouds. After all, if enterprises want to access AI efficiently and quickly, they must first let go of their "obsession" with customized secondary development.
Take the face-recognition system on a large company's campus as an example: it supports recognition of 100,000 IDs, and each training run must process roughly 5 million photos. A single company can hardly absorb data at that scale on its own; collaborative development by multiple companies within a public ecosystem is far more effective.


For cloud vendors, shrinking the share of private cloud saves real money. Just look at Kingsoft Cloud: as its private cloud business grew, its "solution development and service costs" also climbed rapidly.

With more revenue and fewer expenses, cloud vendors naturally gain more room for maneuver on their income statements. Although the price cuts will squeeze them in the short term, weighed against the sweetness ahead, it may well be a good trade.

The so-called new round of price war is essentially a new round of cloud computing war.
The key this time is to seize the opportunity of generative AI and capture the market, while also establishing an ecosystem flywheel in pursuit of greater scale and profit margins.
In fact, the current price war may be just the beginning. After all, standing at such a crossroads, no one can be immune.